Using Bad Learners to find Good Configurations
Finding the optimally performing configuration of a software system for a
given setting is often challenging. Recent approaches address this challenge by
learning performance models based on a sample set of configurations. However,
building an accurate performance model can be very expensive (and is often
infeasible in practice). The central insight of this paper is that exact
performance values (e.g. the response time of a software system) are not
required to rank configurations and to identify the optimal one. As shown by
our experiments, models that are cheap to learn but inaccurate (with respect to
the difference between actual and predicted performance) can still be used to rank
configurations and hence to find the optimal configuration. This novel
\emph{rank-based approach} allows us to significantly reduce the cost (in terms
of the number of measurements of sample configurations) as well as the time required
to build models. We evaluate our approach with 21 scenarios based on 9 software
systems and demonstrate that our approach is beneficial in 16 scenarios; for
the remaining 5 scenarios, an accurate model can be built by using very few
samples anyway, without the need for a rank-based approach.
Comment: 11 pages, 11 figures
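The central insight of this abstract can be illustrated with a minimal sketch: a model whose absolute predictions are far off can still order configurations correctly, which is all that is needed to pick the best one. The configuration space and performance functions below are hypothetical, invented purely for illustration.

```python
# Illustration of the rank-based insight: a model can be wildly inaccurate
# in absolute terms yet preserve the ordering of configurations, which is
# sufficient to identify the optimum. All values here are hypothetical.

configs = [(n, c) for n in (1, 2, 4, 8) for c in (16, 64, 256)]

def true_response_time(cfg):
    """Ground-truth performance (expensive to measure in practice)."""
    n, c = cfg
    return 100.0 / n + 0.05 * c

def cheap_model(cfg):
    """A deliberately inaccurate surrogate: large absolute error, but a
    monotone distortion of the truth, so rankings are unchanged."""
    return 3.0 * true_response_time(cfg) + 250.0

best_true = min(configs, key=true_response_time)
best_ranked = min(configs, key=cheap_model)

# Mean absolute prediction error is huge...
mae = sum(abs(cheap_model(c) - true_response_time(c)) for c in configs) / len(configs)
print(f"mean absolute error: {mae:.1f}")
# ...yet the rank-based choice matches the true optimum.
print(best_true == best_ranked)  # True
```

The point is that any order-preserving distortion of the performance surface leaves the argmin untouched, so model accuracy and ranking quality can be decoupled.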
Resource use during systematic review production varies widely: a scoping review
Objective: We aimed to map the resource use during systematic review (SR) production and reasons why steps of the SR production
are resource intensive to discover where the largest gain in improving efficiency might be possible.
Study design and setting: We conducted a scoping review. An information specialist searched multiple databases (e.g., Ovid
MEDLINE, Scopus) and implemented citation-based and grey literature searching. We employed dual and independent screenings of
records at the title/abstract and full-text levels and data extraction.
Results: We included 34 studies. Thirty-two reported on resource use (mostly time); four described reasons why steps of
the review process are resource intensive. Study selection, data extraction, and critical appraisal seem to be very resource intensive, while protocol development, literature search, and study retrieval take less time. Project management and administration required a large
proportion of SR production time. Lack of experience, of domain knowledge, of collaborative and SR-tailored software, and of good
communication and management can be reasons why SR steps are resource intensive.
Conclusion: Resource use during SR production varies widely. The areas with the largest resource use are administration and project
management, study selection, data extraction, and critical appraisal of studies.
Heart Failure in the Elderly
Heart failure is a clinical syndrome with various causes for which no universally
accepted definition exists. Packer's definition of heart failure "representing a
complex clinical syndrome characterised by abnormalities of left ventricular function
and neurohumoral regulation, which are accompanied by effort intolerance, fluid
retention and reduced longevity" reveals the complexity of the syndrome.
Heart failure is one of the commonest cardiovascular disorders in Western Society and a
growing major public health problem. It has been estimated that in the Netherlands the
number of hospital discharges for heart failure rose from 14,441 in 1980 to 25,966 in
1993. The prevalence of heart failure rises rapidly with age, from 0.7% in those aged
55-64 to 13.0% in those aged 74-84. This indicates a rapidly expanding problem
mainly due to an increase in the number of elderly.
Despite the fact that heart failure and its precursor, left ventricular systolic dysfunction,
are increasingly being recognised as important causes of morbidity and mortality,
epidemiologic data are scarce. For example, reliable information on the incidence of
the syndrome is very limited. One of the reasons for the lack of epidemiologic data on
heart failure is the difficulty of diagnosing early stages of heart failure and the virtual
absence of target cohort studies. In the Netherlands and in the UK most heart failure
patients are detected and treated in general practice. Heart failure is difficult for the
general practitioner to diagnose, owing to the unavailability of more sophisticated or invasive
diagnostic tools, and the diagnosis is primarily based on clinical judgement. In recent years,
neurohumoral and Doppler echocardiographic measurements have emerged as non-invasive
tools that could aid in the diagnosis of heart failure, also in a non-hospital
setting.
Heart failure carries a poor prognosis, but, again, data from population-based studies,
notably those addressing the prognostic implications of asymptomatic ventricular
dysfunction, are limited.
Standard classification of expense
Adopted November 11th, 1920. The basic costs to be ascertained were warehouse, direct shipment, and indirect shipment. The total expense of doing business includes all three of these factors.
The Committee discussed at length the advisability of following the plan of some cost systems which start with the invoiced unit cost and then load this with the burden of expense incurred during transit through the various processes involved in filling an order. In following such a procedure, overhead is naturally assessed on a basis of price per pound or some similar unit, but we found the units in a paper warehouse differed so in character that it would be impracticable to follow such a plan. For instance, it developed that some paper warehouses handled not only the various kinds of paper, but ice cream cones, butter dishes, clothes lines, matches, hammocks, automobile tires, etc. For this reason, and also because the majority of houses kept no record of sales and purchases on a tonnage basis, it was found impracticable to operate the system in its initial stages on a price-per-pound basis. Therefore the only easy, workable method was to operate on a percentage of sales, and the system was devised along this line.
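The committee's reasoning can be illustrated with a toy calculation of the allocation basis it settled on: spreading the total expense of doing business as a uniform percentage of sales, rather than loading it per pound. All figures below are hypothetical.

```python
# Sketch of the percentage-of-sales allocation the committee adopted.
# Expense and sales figures are invented for illustration; a mixed
# warehouse carries goods with no meaningful common weight unit,
# which is why a price-per-pound burden was rejected.

total_expense = 12_000.0  # annual expense of doing business (assumed)

sales = {
    "bond paper": 40_000.0,
    "ice cream cones": 8_000.0,
    "clothes lines": 2_000.0,
}
total_sales = sum(sales.values())

# One uniform expense rate applied to every line of goods.
expense_rate = total_expense / total_sales
loaded = {item: amount * expense_rate for item, amount in sales.items()}

print(f"expense rate: {expense_rate:.0%} of sales")
for item, burden in loaded.items():
    print(f"{item}: ${burden:,.2f} of expense allocated")
```

By construction the allocated burdens sum back to the total expense, and no tonnage records are needed, only the sales figures every house already keeps.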
Estimating the Costs of Foundational Public Health Capabilities: A Recommended Methodology
The Institute of Medicine's 2012 report on public health financing recommended the convening of expert panels to identify the components and costs of a "minimum package of public health services" that should be available in every U.S. community. The report recommended that this minimum package include a core set of public health programs that target specific, high-priority preventable health problems and risks, along with a set of "foundational public health capabilities" that are deemed necessary to support the successful implementation of public health programs and policies. In response to this recommendation, the Robert Wood Johnson Foundation, in collaboration with the US Centers for Disease Control and Prevention and other national professional associations, formed the Public Health Leadership Forum, an expert consensus panel process to identify a recommended set of core programs and foundational capabilities for the nation. The Forum's initial charge focused on the specification of foundational public health capabilities. The Foundational Capabilities Workgroup was formed as a part of the Forum to identify and define the elements to be included as foundational capabilities for governmental public health agencies at both state and local levels.
The Robert Wood Johnson Foundation asked the National Coordinating Center for Public Health Services and Systems Research based at the University of Kentucky to convene a second expert panel workgroup, the Workgroup on Public Health Cost Estimation, to develop a methodology for estimating the resources required to develop and maintain foundational capabilities by governmental public health agencies at both state and local levels. Working in parallel with the Foundational Capabilities Workgroup, this Cost Estimation Workgroup has considered relevant cost-accounting models and cost estimation methodologies, and reviewed related cost estimation studies, in order to make recommendations on an approach for generating first-generation estimates of the costs associated with developing and maintaining foundational capabilities.
Mind the Gap…Mind the Chasm: exploring inclusion and equity in Alaska's education system
Estimating the incidence, prevalence and true cost of asthma in the UK: secondary analysis of national stand-alone and linked databases in England, Northern Ireland, Scotland and Wales-a study protocol.
INTRODUCTION: Asthma is now one of the most common long-term conditions in the UK. It is therefore important to develop a comprehensive appreciation of the healthcare and societal costs in order to inform decisions on care provision and planning. We plan to build on our earlier estimates of national prevalence and costs from asthma by filling the data gaps previously identified in relation to healthcare and broadening the field of enquiry to include societal costs. This work will provide the first UK-wide estimates of the costs of asthma. In the context of asthma for the UK and its member countries (ie, England, Northern Ireland, Scotland and Wales), we seek to: (1) produce a detailed overview of estimates of incidence, prevalence and healthcare utilisation; (2) estimate health and societal costs; (3) identify any remaining information gaps and explore the feasibility of filling these and (4) provide insights into future research that has the potential to inform changes in policy leading to the provision of more cost-effective care.
METHODS AND ANALYSIS: Secondary analyses of data from national health surveys, primary care, prescribing, emergency care, hospital, mortality and administrative data sources will be undertaken to estimate prevalence, healthcare utilisation and outcomes from asthma. Data linkages and economic modelling will be undertaken in an attempt to populate data gaps and estimate costs. Separate prevalence and cost estimates will be calculated for each of the UK-member countries and these will then be aggregated to generate UK-wide estimates.
ETHICS AND DISSEMINATION: Approvals have been obtained from the NHS Scotland Information Services Division's Privacy Advisory Committee, the Secure Anonymised Information Linkage Collaboration Review System, the NHS South-East Scotland Research Ethics Service and The University of Edinburgh's Centre for Population Health Sciences Research Ethics Committee. We will produce a report for Asthma-UK, submit papers to peer-reviewed journals and construct an interactive map.
Helium release from type 304 stainless steel
Helium in very low concentration (less than 1 atomic ppb) has been introduced into type 304 stainless steel by radioactive decay of dissolved tritium. Release of this helium during subsequent annealing was monitored with a high-sensitivity mass spectrometric gas analyzer. With isochronal annealing, helium is released in two temperature ranges, one near 300 °C and the other between 800 °C and the melting point. The latter release is interpreted as attributable to helium gas bubbles. The release near 300 °C was studied isothermally between 150 and 300 °C and is analyzed in terms of two stages of exponential decay. The fast and slow release stages have relaxation times near 10 and 10 s, respectively, and the fast release accounts for roughly 85 percent of the total release at low temperature. From an analysis of the temperature dependence of the release rate, it is concluded that volume diffusion is the controlling mechanism for the outgassing.
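The two-stage exponential decay described above can be sketched as a sum of two exponentials. The amplitudes and relaxation times below are illustrative placeholders (the abstract's exact time constants are garbled in this copy); only the stated ~85% fast-stage share is taken from the text.

```python
# Sketch of a two-stage exponential release model, assuming illustrative
# relaxation times; the fast stage carries ~85% of the low-temperature
# release, as stated in the abstract.
import math

A_FAST, TAU_FAST = 0.85, 10.0     # fast stage: ~85% of release, tau assumed ~10 s
A_SLOW, TAU_SLOW = 0.15, 1000.0   # slow stage: remainder, longer tau (assumed)

def fraction_retained(t):
    """Fraction of the releasable helium still in the specimen at time t (s)."""
    return A_FAST * math.exp(-t / TAU_FAST) + A_SLOW * math.exp(-t / TAU_SLOW)

def fraction_released(t):
    """Cumulative fraction released by time t (s)."""
    return 1.0 - fraction_retained(t)
```

After several fast time constants the released fraction plateaus near the fast-stage amplitude, then creeps toward 1 on the slow-stage timescale, which is the qualitative behavior the isothermal analysis separates into its two stages.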
Epidemiology of Aortic Aneurysm Repair in the United States from 1993 to 2003
The epidemiology of abdominal aortic aneurysm (AAA) disease has been well described over the preceding 50 years. This disease primarily affects elderly males, with smoking, hypertension, and a positive family history contributing to an increased risk of aneurysm formation. The aging population as well as increased screening in high-risk populations has led some to suggest that the incidence of AAAs is increasing. The National Inpatient Sample (1993-2003), a nationally representative database, was used in this study to determine trends in mortality following AAA repair in the United States. In addition, the impact of the introduction of less invasive endovascular AAA repair was assessed. Overall rates of treated unruptured and ruptured AAAs remained stable (unruptured, 12 to 15 per 100,000; ruptured, 1 to 3 per 100,000). In 2003, 42.7% of unruptured and 8.8% of ruptured AAAs were repaired through an endovascular approach. In-hospital mortality following unruptured AAA repair continues to decline for open repair (5.3% to 4.7%, P = 0.007). Mortality after elective endovascular AAA repair has also decreased significantly (2.1% to 1.0%, P = 0.024) and remains lower than for open repair. Mortality rates following repair of ruptured AAAs remain high (open: 46.5% to 40.7%, P = 0.01; endovascular: 40.0% to 35.3%, P = 0.823). These data suggest that the number of patients undergoing elective AAA repair has remained relatively stable despite the introduction of less invasive technology. A shift in the treatment paradigm is occurring, with a higher percentage of patients subjected to elective endovascular AAA repair compared to open repair. This shift, at least in the short term, appears justified, as the mortality in patients undergoing elective endovascular AAA repair is significantly reduced compared to patients undergoing open AAA repair.